Invertible Manifold Learning for Dimension Reduction

Authors

Abstract

It is widely believed that a dimension reduction (DR) process inevitably drops information in most practical scenarios. Thus, most methods, including manifold-based DR methods, try to preserve some essential information of the data after DR. However, they usually fail to yield satisfying results, especially in high-dimensional cases. In the context of manifold learning, we argue that a good low-dimensional representation should preserve the topological and geometric properties of the data manifolds, which involve exactly the entire information of the manifolds. In this paper, we define the problem of information-lossless NLDR under the manifold assumption and propose a novel two-stage method, called invertible manifold learning (inv-ML), to tackle this problem. A local isometry constraint for preserving local geometry is applied in inv-ML. Firstly, a homeomorphic sparse coordinate transformation is learned to find the low-dimensional representation without losing topological information. Secondly, a linear compression is performed on the learned sparse coding, with a trade-off between the target dimension and the incurred information loss. Experiments are conducted on seven datasets with a neural network implementation of inv-ML, called i-ML-Enc, which demonstrate that the proposed inv-ML not only achieves invertible NLDR in comparison with typical existing methods but also reveals the characteristics of the learned manifolds through interpolation in latent space. Moreover, we find that the reliability of the tangent space approximated by the local neighborhood on real-world datasets is key to the success of manifold-based DR algorithms. The code will be made available soon.
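The two-stage idea in the abstract can be illustrated with a toy numerical sketch. This is only an illustration under strong simplifying assumptions: stage 1 below is a fixed orthogonal map (an invertible, locally isometric coordinate change), not the learned homeomorphic sparse transformation of the paper, and stage 2 is plain PCA truncation standing in for the learned linear compression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data on an intrinsically 2-D manifold (a circle) in 10-D ambient space.
t = rng.uniform(0.0, 2.0 * np.pi, size=(500, 1))
latent = np.hstack([np.cos(t), np.sin(t)])
A = np.linalg.qr(rng.normal(size=(10, 2)))[0]   # isometric embedding, 10 x 2
X = latent @ A.T                                # 500 x 10 ambient samples

# Stage 1 stand-in: an invertible coordinate transformation of the ambient
# space. Because Q is orthogonal, the map is lossless and locally isometric.
Q = np.linalg.qr(rng.normal(size=(10, 10)))[0]
Z = X @ Q

# Stage 2: linear compression to a target dimension d, and the information
# loss (relative reconstruction error) it incurs after inverting stage 1.
mu = Z.mean(axis=0)
_, _, Vt = np.linalg.svd(Z - mu, full_matrices=False)
errors = {}
for d in (1, 2, 5):
    P = Vt[:d]                                  # top-d principal directions
    Z_c = (Z - mu) @ P.T @ P + mu               # compressed, then lifted back
    X_rec = Z_c @ Q.T                           # invert stage 1 (Q^-1 = Q^T)
    errors[d] = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
    print(f"d={d}: relative reconstruction error = {errors[d]:.3e}")
```

Once the target dimension reaches the intrinsic dimension (2 here), compression is essentially lossless, matching the claim that NLDR need not drop information; below it, a measurable loss is incurred.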


Similar Articles

Distance Preserving Dimension Reduction for Manifold Learning

Manifold learning is an effective methodology for extracting nonlinear structures from high-dimensional data with many applications in image analysis, computer vision, text data analysis and bioinformatics. The focus of this paper is on developing algorithms for reducing the computational complexity of manifold learning algorithms, in particular, we consider the case when the number of features...


Performance Analysis of a Manifold Learning Algorithm in Dimension Reduction

We consider the performance of local tangent space alignment (Zhang and Zha, 2004), one of several manifold learning algorithms which has been proposed as a dimension reduction method, when errors are present in the observations. Matrix perturbation theory is applied to obtain a worst-case upper bound on the deviation of the solution, which is an invariant subspace. Although we only prove this ...


UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction

UMAP (Uniform Manifold Approximation and Projection) is a novel manifold learning technique for dimension reduction. UMAP is constructed from a theoretical framework based in Riemannian geometry and algebraic topology. The result is a practical scalable algorithm that applies to real world data. The UMAP algorithm is competitive with t-SNE for visualization quality, and arguably preserves more of...


Multilevel Nonlinear Dimensionality Reduction for Manifold Learning

Nonlinear dimensionality reduction techniques for manifold learning, e.g., Isomap, may become exceedingly expensive to carry out for large data sets. This paper explores a multilevel framework with the goal of reducing the cost of unsupervised manifold learning. In addition to savings in computational time, the proposed multilevel technique essentially preserves the geodesic information, and so...


Riemannian Manifold Learning for Nonlinear Dimensionality Reduction

In recent years, nonlinear dimensionality reduction (NLDR) techniques have attracted much attention in visual perception and many other areas of science. We propose an efficient algorithm called Riemannian manifold learning (RML). A Riemannian manifold can be constructed in the form of a simplicial complex, and thus its intrinsic dimension can be reliably estimated. Then the NLDR problem is sol...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2021

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-030-86523-8_43